Re-visiting the echo state property

Authors

  • Izzet B. Yildiz
  • Herbert Jaeger
  • Stefan J. Kiebel
Abstract

An echo state network (ESN) consists of a large, randomly connected neural network, the reservoir, which is driven by an input signal and projects to output units. During training, only the connections from the reservoir to these output units are learned. A key requisite for output-only training is the echo state property (ESP), which means that the effect of initial conditions should vanish as time passes. In this paper, we use analytical examples to show that a widely used criterion for the ESP, namely that the spectral radius of the weight matrix be smaller than unity, is not sufficient to guarantee the echo state property. We obtain these examples by investigating the local bifurcation properties of standard ESNs. Moreover, we provide new sufficient conditions for the echo state property of standard sigmoid and leaky integrator ESNs. We furthermore suggest an improved technical definition of the echo state property, and discuss what practitioners should (and should not) observe when they optimize their reservoirs for specific tasks.
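To make the distinction concrete, the sketch below (Python; all sizes and scalings are illustrative assumptions, and it does not reproduce the paper's new sufficient conditions) contrasts the widely used spectral-radius criterion with the classical, stricter sufficient condition based on the spectral norm of the reservoir matrix, and then empirically checks state forgetting by driving two different initial states with the same input sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not taken from the paper).
N, K = 200, 1

# Random reservoir and input weights; the names W and W_in are generic.
W = rng.uniform(-0.5, 0.5, size=(N, N))
W_in = rng.uniform(-0.5, 0.5, size=(N, K))

# The widely used criterion: spectral radius rho(W) < 1.
rho = np.max(np.abs(np.linalg.eigvals(W)))

# A classical, stricter sufficient condition for the ESP of tanh reservoirs:
# largest singular value (spectral norm) of W below 1.
sigma_max = np.linalg.norm(W, 2)
print(f"spectral radius = {rho:.3f}, largest singular value = {sigma_max:.3f}")

# Rescale W so that the spectral-norm condition holds.
W *= 0.95 / sigma_max

def update(x, u):
    """Standard ESN state update: x(t+1) = tanh(W x(t) + W_in u(t+1))."""
    return np.tanh(W @ x + W_in @ u)

# Empirical state-forgetting check: two different initial states driven by the
# same input sequence should converge if the ESP holds.
x_a = rng.standard_normal(N)
x_b = rng.standard_normal(N)
for _ in range(500):
    u = rng.standard_normal(K)
    x_a, x_b = update(x_a, u), update(x_b, u)
print("state distance after 500 steps:", np.linalg.norm(x_a - x_b))
```

Note that scaling only the spectral radius below 1 would not license the same conclusion, which is exactly the gap the paper's analytical counterexamples expose.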

Related articles

Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks

The echo state property is a key for the design and training of recurrent neural networks within the paradigm of reservoir computing. In intuitive terms, this is a passivity condition: a network having this property, when driven by an input signal, will become entrained by the input and develop an internal response signal. This excited internal dynamics can be seen as a high-dimensional, nonlin...


A local Echo State Property through the largest Lyapunov exponent

Echo State Networks are efficient time-series predictors whose performance depends strongly on the spectral radius of the reservoir connectivity matrix. Based on recent results on the mean field theory of driven random recurrent neural networks, which enable the computation of the largest Lyapunov exponent of an ESN, we develop a cheap algorithm to establish a local and operational version of the Ec...
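The cited algorithm itself is not reproduced here; as a generic sketch (assuming the standard tanh ESN update and illustrative parameter values), the largest Lyapunov exponent along a driven trajectory can be estimated by propagating a tangent vector through the state-dependent Jacobian diag(1 - x^2) W and averaging the log of its growth.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, T = 200, 1, 2000                               # illustrative sizes

W = rng.uniform(-0.5, 0.5, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # illustrative scaling
W_in = rng.uniform(-0.5, 0.5, size=(N, K))

x = np.zeros(N)
v = rng.standard_normal(N)                           # tangent vector
v /= np.linalg.norm(v)
log_growth = 0.0

for t in range(T):
    u = rng.standard_normal(K)
    x = np.tanh(W @ x + W_in @ u)
    # Jacobian of the update w.r.t. the previous state: diag(1 - x^2) W
    J = (1.0 - x**2)[:, None] * W
    v = J @ v
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)
    v /= norm                                        # renormalize each step

lyap = log_growth / T
print("estimated largest Lyapunov exponent:", lyap)
```

A negative estimate is commonly read as evidence of locally contracting, input-driven dynamics, which is the intuition behind a "local" echo state property.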


Randomness and isometries in echo state networks and compressed sensing

Although largely different concepts, echo state networks and compressed sensing models both rely on collections of random weights: the reservoir weights in echo state networks and the sensing coefficients in compressed sensing. Several methods for generating the random matrices, and metrics indicating desirable performance, are well-studied in compressed sensing, but less so for echo state...


Comparison of Three-Dimensional Double-Echo Steady-State Sequence with Routine Two-Dimensional Sequence in the Depiction of Knee Cartilage

Introduction: There are some routine two-dimensional sequences, including short tau inversion recovery (STIR), T2-weighted fast-spin echo (T2W-FSE), and proton-density fast spin-echo for diagnosing osteoarthritis and lesions of the knee cartilage. However, these sequences have some disadvantages, such as long scan time, inadequate spatial resolution, and suboptimal tis...


Deep-ESN: A Multiple Projection-encoding Hierarchical Reservoir Computing Framework

As efficient recurrent neural network (RNN) models, reservoir computing (RC) approaches such as Echo State Networks have attracted widespread attention in the last decade. However, while they have had great success with time series data [1], [2], many time series have a multiscale structure, which a single-hidden-layer RC model may have difficulty capturing. In this paper, we propose a novel hi...
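The specific projection-encoding architecture of Deep-ESN is not reproduced here; as a minimal sketch of the general idea of hierarchical reservoir computing (all sizes, leak rates, and scalings are illustrative assumptions), the states of one reservoir can drive the next one, so that successive layers can evolve on different timescales.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_reservoir(n_in, n_res, rho=0.9):
    """Random reservoir scaled to a target spectral radius (illustrative recipe)."""
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    return W, W_in

def step(x, u, W, W_in, leak):
    """Leaky-integrator update; a smaller leak rate gives slower dynamics."""
    return (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)

# Two stacked reservoirs: layer 2 is driven by the state of layer 1.
N1, N2, K, T = 100, 100, 1, 300
W1, Win1 = make_reservoir(K, N1)
W2, Win2 = make_reservoir(N1, N2)
x1, x2 = np.zeros(N1), np.zeros(N2)

states = []
for t in range(T):
    u = np.array([np.sin(0.1 * t)])          # toy input signal
    x1 = step(x1, u, W1, Win1, leak=0.8)      # fast layer
    x2 = step(x2, x1, W2, Win2, leak=0.2)     # slow layer driven by layer 1
    states.append(np.concatenate([x1, x2]))

# The concatenated layer states could then feed a trained linear readout.
X = np.array(states)
print(X.shape)  # (300, 200)
```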



Journal:
  • Neural networks : the official journal of the International Neural Network Society

Volume 35  Issue 

Pages  -

Publication date 2012